Criteria for longitudinal data model selection based on Kullback's symmetric divergence
Authors

Abstract
Recently, Azari et al. (2006) showed that the Akaike information criterion (AIC) and its corrected versions cannot be applied directly to model selection for longitudinal data with correlated errors. They proposed two model selection criteria, AICc and RICc, based on the likelihood and residual likelihood approaches; both are estimators of the Kullback-Leibler divergence, which is asymmetric. In this work, we apply the likelihood and residual likelihood approaches to propose two new criteria, suitable for small-sample longitudinal data, based on Kullback's symmetric divergence. Their performance relative to other criteria is examined in a large simulation study.
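For reference, the divergences behind these criteria admit a compact statement. In the display below, f is the true density, g_θ a fitted candidate model, L(θ̂) the maximized likelihood, and k the number of estimated parameters; this is the standard formulation underlying AIC and Cavanaugh's (1999) KIC, sketched here for orientation rather than quoted from the paper:

```latex
% Directed (asymmetric) Kullback-Leibler divergence of g_theta from f
I(f, g_\theta) = \mathrm{E}_f\!\left[ \log \frac{f(y)}{g_\theta(y)} \right]

% Kullback's symmetric divergence: the sum of the two directed divergences
J(f, g_\theta) = I(f, g_\theta) + I(g_\theta, f)

% Large-sample criteria targeting variants of I and J, respectively
\mathrm{AIC} = -2 \log L(\hat{\theta}) + 2k, \qquad
\mathrm{KIC} = -2 \log L(\hat{\theta}) + 3k
```

The two criteria proposed in the paper can be read as small-sample corrections in this family, targeting J rather than I, with the likelihood or the residual (restricted) likelihood supplying the goodness-of-fit term.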
Similar Articles
A Large-Sample Model Selection Criterion Based on Kullback's Symmetric Divergence
The Akaike information criterion, AIC, is a widely known and extensively used tool for statistical model selection. AIC serves as an asymptotically unbiased estimator of a variant of Kullback's directed divergence between the true model and a fitted approximating model. The directed divergence is an asymmetric measure of separation between two statistical models, meaning that an alternate directed...
Criteria for Linear Model Selection Based on Kullback's Symmetric Divergence
Model selection criteria frequently arise from constructing estimators of discrepancy measures used to assess the disparity between the 'true' model and a fitted approximating model. The Akaike (1973) information criterion and its variants result from utilizing Kullback's (1968) directed divergence as the targeted discrepancy. The directed divergence is an asymmetric measure of separation between...
Is First-Order Vector Autoregressive Model Optimal for fMRI Data?
We consider the problem of selecting the optimal orders of vector autoregressive (VAR) models for fMRI data. Many previous studies used a model order of one and ignored that it may vary considerably across data sets depending on different data dimensions, subjects, tasks, and experimental designs. In addition, the classical information criteria (IC) used (e.g., the Akaike IC (AIC)) are biased and...
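To make the order-selection issue concrete, the sketch below fits VAR(p) models of increasing order to a short synthetic series and prints two classical criteria. It relies on statsmodels; the simulated coefficients and the plain AIC/BIC comparison are illustrative assumptions, not the bias-corrected criteria the snippet alludes to.

```python
import numpy as np
from statsmodels.tsa.api import VAR

# Simulate a short bivariate VAR(2) series (illustrative, not fMRI data).
rng = np.random.default_rng(0)
n, dim = 120, 2
A1 = np.array([[0.5, 0.1], [0.0, 0.4]])
A2 = np.array([[0.2, 0.0], [0.1, 0.2]])
y = np.zeros((n, dim))
for t in range(2, n):
    y[t] = A1 @ y[t - 1] + A2 @ y[t - 2] + rng.normal(scale=0.5, size=dim)

# Fit each candidate order and compare criteria; on short series the
# classical (uncorrected) criteria can disagree or over-/underfit.
for p in range(1, 6):
    res = VAR(y).fit(p)
    print(f"p={p}  AIC={res.aic:.3f}  BIC={res.bic:.3f}")
```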
A model selection approach to signal denoising using Kullback's symmetric divergence
We consider the determination of a soft/hard coefficient threshold for the recovery of a signal embedded in additive Gaussian noise. This is closely related to the problem of variable selection in linear regression. Viewing the denoising problem as a model selection one, we propose a new information-theoretic model selection approach to signal denoising. We first construct a statistical model for the...
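The soft and hard thresholding rules mentioned above have simple closed forms; a minimal numpy sketch follows, with the threshold t left as an input, since choosing it is precisely the model selection problem the paper casts information-theoretically.

```python
import numpy as np

def hard_threshold(x, t):
    # Keep coefficients whose magnitude exceeds t; zero out the rest.
    return np.where(np.abs(x) > t, x, 0.0)

def soft_threshold(x, t):
    # Shrink every coefficient toward zero by t, truncating at zero.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)
```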
A Bootstrap Model Selection Criterion Based on Kullback's Symmetric Divergence
Following recent work of [Cavanaugh, 1999] and [Seghouane, 2002], we propose a new corrected variant of KIC developed for the purpose of source separation. Our variant utilizes bootstrapping to provide an estimate of the expected Kullback-Leibler symmetric divergence between the model generating the data and a fitted approximating model. Simulation results that illustrate the performance of t...
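As a rough illustration of the bootstrap idea, the sketch below works in a Gaussian linear model, where the symmetric divergence between two fitted models has a closed form: refit on residual-bootstrap resamples and average the divergence between the original and bootstrap fits. The model class, the resampling scheme, and the function names are assumptions made for illustration; this is not the corrected KIC variant of the cited work.

```python
import numpy as np

def gaussian_sym_div(X, b1, s1, b2, s2):
    # Kullback's symmetric divergence J between N(X b1, s1*I) and N(X b2, s2*I),
    # i.e. the sum of the two directed Kullback-Leibler divergences.
    n = X.shape[0]
    d2 = np.sum((X @ (b1 - b2)) ** 2)
    kl12 = 0.5 * (n * s1 / s2 - n + n * np.log(s2 / s1) + d2 / s2)
    kl21 = 0.5 * (n * s2 / s1 - n + n * np.log(s1 / s2) + d2 / s1)
    return kl12 + kl21

def bootstrap_sym_div(y, X, B=200, seed=0):
    # Residual-bootstrap estimate of the expected symmetric divergence
    # between the fitted model and a refit on resampled data (illustrative).
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    b_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ b_hat
    s_hat = resid @ resid / n
    total = 0.0
    for _ in range(B):
        y_star = X @ b_hat + rng.choice(resid, size=n, replace=True)
        b_star, *_ = np.linalg.lstsq(X, y_star, rcond=None)
        r_star = y_star - X @ b_star
        s_star = r_star @ r_star / n
        total += gaussian_sym_div(X, b_hat, s_hat, b_star, s_star)
    return total / B
```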